Two modified scaled nonlinear conjugate gradient methods

Authors

Abstract


Related Articles

Two new conjugate gradient methods based on modified secant equations

Following the approach proposed by Dai and Liao, we introduce two nonlinear conjugate gradient methods for unconstrained optimization problems. One of our proposed methods is based on a modified version of the secant equation proposed by Zhang, Deng and Chen, and Zhang and Xu, and the other is based on the modified BFGS update proposed by Yuan. An interesting feature of our methods is their acco...
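For orientation, the objects such methods work with can be sketched in standard notation (g_k is the gradient at x_k, d_k the search direction, B_{k+1} a Hessian approximation; the scalar t and the exact modified secant vector used by the authors are not given in the snippet, so the relations below are the classical ones rather than the paper's variants):

\[
s_k = x_{k+1} - x_k, \qquad y_k = g_{k+1} - g_k, \qquad B_{k+1} s_k = y_k \quad \text{(standard secant equation)},
\]
\[
\beta_k^{\mathrm{DL}} = \frac{g_{k+1}^{T}\left(y_k - t\, s_k\right)}{d_k^{T} y_k}, \qquad t \ge 0 \quad \text{(Dai-Liao parameter)}.
\]

Modified secant equations of the Zhang-Deng-Chen and Zhang-Xu type replace y_k by a corrected vector that also incorporates the function values f_k and f_{k+1}, which is the ingredient the first proposed method builds on.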


On nonlinear generalized conjugate gradient methods

where F(ξ) is a nonlinear operator from a real Euclidean space of dimension n, or a Hilbert space, into itself. The Euclidean norm and the corresponding inner product will be denoted by ‖·‖₁ and (·,·)₁, respectively. A more general inner product with a weight function and the corresponding norm will be denoted by (·,·)₀ and ‖·‖, respectively. In the first part of this article (Sects. 2 and 3) w...
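Purely as an illustration of this notation (in the paper the weight may be a function on a Hilbert space rather than a matrix), the finite-dimensional case of a weighted inner product can be written with a symmetric positive definite weight matrix W:

\[
(u, v)_0 = u^{T} W v = (W u, v)_1, \qquad \|u\| = \sqrt{(u, u)_0},
\]

so that (·,·)₁ and ‖·‖₁ are recovered when W is the identity.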


Convergence Properties of Nonlinear Conjugate Gradient Methods

Recently, important contributions to convergence studies of conjugate gradient methods have been made by Gilbert and Nocedal [6]. They introduce a “sufficient descent condition” to establish global convergence results, whereas this condition is not needed in the convergence analyses of Newton and quasi-Newton methods. [6] hints that the sufficient descent condition, which was enforced by their ...
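The sufficient descent condition referred to here is usually stated as the existence of a constant c > 0 such that

\[
g_k^{T} d_k \le -c\, \|g_k\|^{2} \quad \text{for all } k,
\]

which strengthens the plain descent requirement g_k^T d_k < 0 by keeping the directional derivative bounded away from zero relative to ‖g_k‖².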


A Survey of Nonlinear Conjugate Gradient Methods

This paper reviews the development of different versions of nonlinear conjugate gradient methods, with special attention given to global convergence properties.
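For reference, the generic nonlinear conjugate gradient iteration that such surveys compare is

\[
x_{k+1} = x_k + \alpha_k d_k, \qquad d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k,
\]

with α_k obtained from a line search and with classical choices of the update parameter such as

\[
\beta_k^{\mathrm{FR}} = \frac{\|g_{k+1}\|^{2}}{\|g_k\|^{2}}, \qquad
\beta_k^{\mathrm{PRP}} = \frac{g_{k+1}^{T} y_k}{\|g_k\|^{2}}, \qquad
\beta_k^{\mathrm{HS}} = \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k}, \qquad y_k = g_{k+1} - g_k;
\]

the surveyed variants differ mainly in the formula for β_k and in the line search conditions under which global convergence can be proved.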


Exploiting damped techniques for nonlinear conjugate gradient methods

In this paper we propose the use of damped techniques within Nonlinear Conjugate Gradient (NCG) methods. Damped techniques were introduced by Powell and recently reproposed by Al-Baali and, until now, have been applied only in the framework of quasi-Newton methods. We extend their use to NCG methods in large-scale unconstrained optimization, aiming at possibly improving the efficiency and the robustness of...
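Powell's damping, as used with quasi-Newton updates, replaces y_k by a convex combination with B_k s_k. The sketch below is the standard damped-BFGS rule (with the usual choice σ = 0.2) and is not necessarily the exact rule this paper adopts for NCG methods:

\[
\hat{y}_k = \phi_k\, y_k + (1 - \phi_k)\, B_k s_k, \qquad
\phi_k =
\begin{cases}
1, & s_k^{T} y_k \ge \sigma\, s_k^{T} B_k s_k, \\
\dfrac{(1 - \sigma)\, s_k^{T} B_k s_k}{s_k^{T} B_k s_k - s_k^{T} y_k}, & \text{otherwise}.
\end{cases}
\]

This choice enforces s_k^T ŷ_k ≥ σ s_k^T B_k s_k > 0 and thereby preserves positive definiteness of the updated matrix.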



Journal

Journal title: Journal of Computational and Applied Mathematics

Year: 2014

ISSN: 0377-0427

DOI: 10.1016/j.cam.2013.11.001